Tongyi DeepResearch 30B is a 30-billion-parameter large language model designed for long-horizon, deep information-seeking tasks. It performs strongly on multiple agentic search benchmarks, applies novel quantization methods to improve performance, and is trained through a pipeline of agentic pre-training, supervised fine-tuning, and reinforcement learning.
Tags: Natural Language Processing · Transformers · English